A Kullback-Leibler View of Maximum Entropy and Maximum Log-Probability Methods
Authors
Abstract
Entropy methods enable a convenient general approach to assigning a probability distribution when only partial information is available. The minimum cross-entropy principle selects the distribution that minimizes the Kullback–Leibler divergence from a target distribution subject to the given constraints. This general principle encompasses a wide variety of distributions and generalizes other methods that have been proposed independently. There remains, however, some confusion in the literature about the breadth of entropy methods. In particular, the asymmetry of the Kullback–Leibler divergence yields two important special cases when the target distribution is uniform: the maximum entropy method and the maximum log-probability method. This paper compares the performance of both methods under a variety of conditions. We also examine a generalized maximum log-probability method as a further demonstration of the generality of the entropy approach.
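To make the two special cases concrete, here is a minimal sketch, not taken from the paper: with a uniform target u on a six-point support, minimizing D(p‖u) over p gives the maximum entropy distribution, while minimizing D(u‖p) gives the maximum log-probability distribution. The support {1, ..., 6}, the mean constraint E[X] = 4.5, and the solver are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): the two KL directions with a uniform
# target on {1, ..., 6} under an assumed mean constraint E[X] = 4.5.
import numpy as np
from scipy.optimize import minimize

x = np.arange(1, 7)                     # support {1, ..., 6} (assumed)
n = len(x)
cons = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},   # normalization
    {"type": "eq", "fun": lambda p: p @ x - 4.5},     # mean constraint (assumed)
]
bounds = [(1e-9, 1.0)] * n
p0 = np.full(n, 1.0 / n)

# Maximum entropy: minimize D(p || u) = sum p log(n p), i.e. maximize Shannon entropy.
max_ent = minimize(lambda p: np.sum(p * np.log(p)), p0,
                   bounds=bounds, constraints=cons)

# Maximum log-probability: minimize D(u || p) = -log n - (1/n) sum log p,
# i.e. maximize the sum of log-probabilities.
max_logp = minimize(lambda p: -np.sum(np.log(p)) / n, p0,
                    bounds=bounds, constraints=cons)

print("max entropy        :", np.round(max_ent.x, 4))
print("max log-probability:", np.round(max_logp.x, 4))
```

Qualitatively, the two directions behave differently near zero: D(u‖p) contains -log p_i, which diverges as any p_i approaches zero, so the maximum log-probability solution keeps all outcomes bounded away from zero probability more aggressively than the maximum entropy solution does.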
Similar resources
Quasi-continuous maximum entropy distribution approximation with kernel density
This paper extends maximum entropy estimation of discrete probability distributions to the continuous case. This transition leads to a nonparametric estimate of the probability density function that preserves the maximum entropy principle. Furthermore, the derived density estimate provides a minimum mean integrated square error. In a second step, it is shown how boundary conditions can be included...
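As a rough illustration of the approach this snippet describes, the sketch below solves a discrete maximum entropy problem on a grid under assumed moment constraints and then smooths the resulting point masses with a Gaussian kernel. The grid, constraints, and bandwidth are all assumptions for illustration, not the paper's choices.

```python
# Rough sketch: discrete maximum entropy on a grid, then kernel smoothing.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

grid = np.linspace(-4, 4, 41)                 # support grid (assumed)
cons = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},        # normalization
    {"type": "eq", "fun": lambda p: p @ grid - 0.0},       # mean 0 (assumed)
    {"type": "eq", "fun": lambda p: p @ grid**2 - 1.0},    # E[X^2] = 1 (assumed)
]
p0 = np.full(grid.size, 1.0 / grid.size)
res = minimize(lambda p: np.sum(p * np.log(p)), p0,
               bounds=[(1e-9, 1.0)] * grid.size, constraints=cons)

h = 0.4                                       # kernel bandwidth (assumed)

def density(t):
    # Mixture of Gaussian kernels at grid points, weighted by the maxent masses.
    return np.sum(res.x * norm.pdf((t - grid) / h) / h)

print(density(0.0))   # should be close to the N(0, 1) density at 0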
Some statistical inferences on the upper record of Lomax distribution
In this paper, we investigate some inferential properties of upper records from the Lomax distribution. We also estimate the Lomax distribution parameters based on upper records using the method of moments (MME), maximum likelihood (MLE), Kullback-Leibler divergence of the survival function (DLS), and Bayesian estimation. Finally, we compare these methods using Monte Carlo simulation.
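For orientation, the following sketch shows plain maximum likelihood fitting of a Lomax (Pareto II) sample with scipy. It fits an ordinary i.i.d. sample rather than the record-value likelihood the paper studies, and the MME, DLS, and Bayesian estimators are omitted; the true parameters and sample size are arbitrary choices.

```python
# Minimal sketch (assumed, not from the paper): MLE for the Lomax distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_shape, true_scale = 2.0, 1.5            # assumed true parameters
data = stats.lomax.rvs(true_shape, scale=true_scale, size=5000, random_state=rng)

# Fit by MLE, fixing the location at 0 as in the standard Lomax parameterization.
shape_hat, loc_hat, scale_hat = stats.lomax.fit(data, floc=0)
print(f"MLE: shape = {shape_hat:.3f}, scale = {scale_hat:.3f}")
```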
Entropy Methods in Random Motion
We analyze the contrasting dynamical behavior of Gibbs–Shannon and conditional Kullback-Leibler entropies, induced by the time evolution of continuous probability distributions. The question of a predominantly purpose-dependent entropy definition for non-equilibrium model systems is addressed. The conditional Kullback–Leibler entropy is often believed to properly capture physical features of an asymptotic...
Entropy and Divergence Associated with Power Function and the Statistical Application
In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, the entropy corresponds to the maximum likelihood method, in which Kullback-Leibler divergence connects Boltzmann-Shannon entropy and the expected log-likelihood function. Maximum likelihood estimation has been supported for its optimal performance...
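The connection this snippet mentions can be checked numerically: the expected negative log-likelihood (cross-entropy) decomposes into Shannon entropy plus the Kullback-Leibler divergence, H(p, q) = H(p) + D(p‖q). The two distributions below are arbitrary illustrative choices.

```python
# Small numeric check (illustrative, not from the paper) of
# cross-entropy = entropy + KL divergence.
import numpy as np

p = np.array([0.5, 0.3, 0.2])     # data-generating distribution (assumed)
q = np.array([0.4, 0.4, 0.2])     # model distribution (assumed)

cross_entropy = -np.sum(p * np.log(q))   # expected negative log-likelihood under p
entropy = -np.sum(p * np.log(p))         # Shannon entropy of p
kl = np.sum(p * np.log(p / q))           # D(p || q)
print(np.isclose(cross_entropy, entropy + kl))   # True
```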
Maximum Entropy and Maximum Probability
Sanov’s Theorem and the Conditional Limit Theorem (CoLT) are established for a multicolor Pólya–Eggenberger urn sampling scheme, giving the Pólya divergence and the Pólya extension to the Maximum Relative Entropy (MaxEnt) method. Pólya MaxEnt includes the standard MaxEnt as a special case. The universality of standard MaxEnt advocated by an axiomatic approach to inference for inverse problems is...
Journal: Entropy
Volume: 19
Issue: -
Pages: -
Publication year: 2017